
Objectivity

Characteristic Name: Objectivity
Dimension: Reliability and Credibility
Description: Data are unbiased and impartial
Granularity: Information object
Implementation Type: Process-based approach
Characteristic Type: Usage

Verification Metric:

The number of tasks that failed or underperformed due to biased or partial data
The number of complaints received due to biased or partial data


The implementation guidelines are practices to follow with regard to the characteristic; the scenarios are examples of how each guideline can be implemented.

Guideline: Identify all the factors that make particular data/information biased for the intended use, and take preventive actions to eliminate them.
Scenario: (1) A written questionnaire is better than a face-to-face interview for collecting sensitive personal data.

Guideline: Design and execute preventive actions for all possible information distortions (malfunctioning or personal biases) which may be caused by information/data collectors.
Scenario: (1) Perform a dual-coder approach to code qualitative data (see the sketch after these guidelines).

Guideline: Design and execute preventive actions for all possible information distortions (malfunctioning or personal biases) which may be caused by information/data transmitters.
Scenario: (1) After a survey is performed, each participant is contacted individually by a party other than the person who conducted the survey, and it is randomly verified that the participant's real responses have been recorded properly.
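
For the dual-coder scenario, agreement between the two coders can be quantified before the coded data are accepted. The following is a minimal sketch in Python, assuming each coder assigns one categorical code per item; the function and sample labels are illustrative, not drawn from the sources.

from collections import Counter

def cohens_kappa(codes_a, codes_b):
    # Cohen's kappa: chance-corrected agreement between two coders.
    n = len(codes_a)
    # Observed agreement: fraction of items coded identically by both coders.
    p_o = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Expected chance agreement, from each coder's marginal code frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Two coders labelling the same ten interview fragments (illustrative data).
coder1 = ["pos", "neg", "pos", "pos", "neu", "neg", "pos", "neu", "neg", "pos"]
coder2 = ["pos", "neg", "pos", "neu", "neu", "neg", "pos", "pos", "neg", "pos"]
print(round(cohens_kappa(coder1, coder2), 2))  # 0.68

A kappa close to 1 indicates the coding is reproducible; a low kappa signals personal bias or ambiguous coding rules that should be resolved before the data are used.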

Validation Metric:

How mature is the process to prevent biased and partial data

These are examples of how the characteristic might occur in a database.

Example: Consider an inventory database that contains part numbers, warehouse locations, quantity on hand, and other information. However, it does not contain source information (where the parts came from). If a part is supplied by multiple suppliers, once the parts are received and put on the shelf there is no indication of which supplier the parts came from. The information in the database is always accurate and current. For normal inventory transactions and decision making, the database is certainly of high quality. If a supplier reports that one of their shipments contained defective parts, this database is of no help in identifying whether they have any of those parts or not. The database is of poor quality because it does not contain a relevant element of information. Without that information, the database is poor data quality for the intended use.
Source: J. E. Olson, "Data Quality: The Accuracy Dimension", Morgan Kaufmann Publishers, 9 January 2003.
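
A minimal sqlite3 sketch of the point above (the schema and names are illustrative assumptions, not taken from Olson): once each received lot records its supplier, the recall question becomes a one-line query; without that column it cannot be answered from the data at all.

import sqlite3

conn = sqlite3.connect(":memory:")
# Inventory schema extended with the supplier of each received lot.
conn.execute("CREATE TABLE inventory (part_no TEXT, warehouse TEXT, "
             "qty INTEGER, supplier TEXT)")
conn.executemany("INSERT INTO inventory VALUES (?, ?, ?, ?)",
                 [("P-100", "WH1", 40, "Acme"),
                  ("P-100", "WH2", 25, "Bolt Co"),
                  ("P-200", "WH1", 10, "Acme")])

# A supplier reports a defective shipment: which on-hand parts are affected?
rows = conn.execute("SELECT part_no, warehouse, qty FROM inventory "
                    "WHERE supplier = ?", ("Acme",)).fetchall()
print(rows)  # [('P-100', 'WH1', 40), ('P-200', 'WH1', 10)]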

The definitions below are examples of how the characteristic is defined in the sources provided.

Definition: The degree to which Information is presented without bias, enabling the Knowledge Worker to understand the meaning and significance without misinterpretation.
Source: ENGLISH, L. P. 2009. Information Quality Applied: Best Practices for Improving Business Information, Processes and Systems, Wiley Publishing.

Definition: Is the information free of distortion, bias, or error?
Source: EPPLER, M. J. 2006. Managing Information Quality: Increasing the Value of Information in Knowledge-intensive Products and Processes, Springer.

Definition: 1) Data are unbiased and impartial. 2) Objectivity is the extent to which data are unbiased (unprejudiced) and impartial.
Source: WANG, R. Y. & STRONG, D. M. 1996. Beyond accuracy: What data quality means to data consumers. Journal of Management Information Systems, 5-33.


Statistical validity

Characteristic Name: Statistical validity
Dimension: Validity
Description: Computed data must be statistically valid
Granularity: Information object
Implementation Type: Process-based approach
Characteristic Type: Usage

Verification Metric:

The number of tasks that failed or underperformed due to lack of statistical validity in data
The number of complaints received due to lack of statistical validity of data


The implementation guidelines are practices to follow with regard to the characteristic; the scenarios are examples of how each guideline can be implemented.

Guideline: Establish the population of interest unambiguously, with appropriate justification (maintain documentation).
Scenario: (1) Both credit customers and cash customers are considered for a survey on customer satisfaction.

Guideline: Establish an appropriate sampling method, with appropriate justification.
Scenario: (1) Stratified sampling is used to investigate the drug preferences of medical officers.

Guideline: Establish the statistical validity of samples, avoiding overcoverage and undercoverage (maintain documentation).
Scenario: (1) Samples are taken from all income levels in a survey on vaccination.

Guideline: Maintain consistency of samples in case longitudinal analysis is performed (maintain documentation).
Scenario: (1) The same population is used over time to collect epidemic data for a longitudinal analysis.

Guideline: Ensure that valid statistical methods are used to enable valid inferences about data, valid comparisons of parameters, and generalisation of the findings.
Scenario: (1) A Poisson distribution is used to make inferences, since the data-generating events occur within a fixed interval of time and/or space.

Guideline: Ensure that the acceptable variations for estimated parameters are established with appropriate justification.
Scenario: (1) A 95% confidence interval is used in estimating the mean value (see the sketch after these guidelines).

Guideline: Ensure that appropriate imputation measures are taken to nullify the impact of problems relating to outliers and to data collection procedures, and that the edit rules are defined and maintained.
Scenario: (1) Incomplete responses are removed from the final data sample.
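
As a concrete illustration of the confidence-interval guideline above, the sketch below computes a 95% interval for a sample mean using a large-sample normal approximation (for small samples a t-distribution would be more appropriate); the data and function name are illustrative assumptions, not from the sources.

from math import sqrt
from statistics import NormalDist, mean, stdev

def mean_confidence_interval(sample, confidence=0.95):
    # Large-sample (normal-approximation) confidence interval for the mean.
    n = len(sample)
    m, s = mean(sample), stdev(sample)
    z = NormalDist().inv_cdf((1 + confidence) / 2)  # approx. 1.96 for 95%
    margin = z * s / sqrt(n)  # z times the standard error of the mean
    return m - margin, m + margin

# e.g. satisfaction scores from a customer survey sample
scores = [4.1, 3.8, 4.4, 4.0, 3.9, 4.2, 4.3, 3.7, 4.0, 4.1,
          3.9, 4.2, 4.0, 4.1, 3.8, 4.3, 4.0, 3.9, 4.2, 4.1]
low, high = mean_confidence_interval(scores)
print(f"95% CI for the mean: ({low:.2f}, {high:.2f})")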

Validation Metric:

How mature is the process to maintain statistical validity of data

These are examples of how the characteristic might occur in a database.

Example: If a column should contain at least one occurrence of all 50 states, but the column contains only 43 states, then the population is incomplete.
Source: Y. Lee, et al., "Journey to Data Quality", Massachusetts Institute of Technology, 2006.
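
A minimal sketch of such a completeness check against a reference population (the US_STATES set, function name, and column data are illustrative assumptions):

# Reference population: the 50 two-letter US state codes.
US_STATES = {
    "AL", "AK", "AZ", "AR", "CA", "CO", "CT", "DE", "FL", "GA",
    "HI", "ID", "IL", "IN", "IA", "KS", "KY", "LA", "ME", "MD",
    "MA", "MI", "MN", "MS", "MO", "MT", "NE", "NV", "NH", "NJ",
    "NM", "NY", "NC", "ND", "OH", "OK", "OR", "PA", "RI", "SC",
    "SD", "TN", "TX", "UT", "VT", "VA", "WA", "WV", "WI", "WY",
}

def missing_members(column_values, population=US_STATES):
    # Members of the reference population absent from the column.
    return sorted(population - set(column_values))

# A column covering only 43 of the 50 states is incomplete.
column = list(US_STATES - {"AK", "DE", "HI", "MT", "RI", "VT", "WY"})
print(missing_members(column))  # ['AK', 'DE', 'HI', 'MT', 'RI', 'VT', 'WY']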

The definitions below are examples of how the characteristic is defined in the sources provided.

Definition: Coherence of data refers to the internal consistency of the data. Coherence can be evaluated by determining if there is coherence between different data items for the same point in time, coherence between the same data items for different points in time, or coherence between organisations or internationally. Coherence is promoted through the use of standard data concepts, classifications and target populations.
Source: HIQA 2011. International Review of Data Quality. Health Information and Quality Authority (HIQA), Ireland. http://www.hiqa.ie/press-release/2011-04-28-international-review-data-quality.

Definition: 1) Accuracy in the general statistical sense denotes the closeness of computations or estimates to the exact or true values. 2) Coherence of statistics is their adequacy to be reliably combined in different ways and for various uses.
Source: LYON, M. 2008. Assessing Data Quality, Monetary and Financial Statistics. Bank of England. http://www.bankofengland.co.uk/statistics/Documents/ms/articles/art1mar08.pdf.